Definitions and Orthogonality
To understand the structure of a matrix, we must first define what it means for subspaces to be perpendicular. Perpendicularity of subspaces is a much stricter condition than orthogonality of two individual vectors.
- Subspace Orthogonality: Two subspaces $V$ and $W$ of a vector space are orthogonal if every vector $v$ in $V$ is perpendicular to every vector $w$ in $W$. Formally: $v^T w = 0$ for all $v \in V$ and all $w \in W$.
- The Orthogonal Complement ($V^\perp$): The orthogonal complement of a subspace $V$ contains every vector that is perpendicular to every vector in $V$. It is denoted $V^\perp$ (pronounced "V perp").
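The definition above can be checked numerically. A minimal NumPy sketch with a hypothetical pair of subspaces of $\mathbb{R}^3$ (the $xy$-plane and the $z$-axis, chosen for illustration): two subspaces are orthogonal exactly when $V^T W$ is the zero matrix for basis matrices $V$ and $W$, since the entries of $V^T W$ are all the pairwise dot products of basis vectors.

```python
import numpy as np

# Hypothetical example: basis vectors stored as columns.
V = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])   # basis of V: the xy-plane
W = np.array([[0.0],
              [0.0],
              [1.0]])        # basis of W: the z-axis

# V and W are orthogonal iff every basis vector of V is perpendicular
# to every basis vector of W, i.e. V^T W = 0.
gram = V.T @ W
print(np.allclose(gram, 0))   # True: the subspaces are orthogonal
```

Checking basis vectors suffices because every $v \in V$ and $w \in W$ is a linear combination of basis vectors, and dot products are linear in each argument.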
The Fundamental Theorem of Orthogonality
The core identity of linear algebra connects the matrix action to the geometry of its spaces:
If $x$ is in the nullspace $N(A)$, then $Ax = 0$. This means the dot product of every row of $A$ with $x$ is zero. Since the row space $C(A^T)$ is spanned by those rows, every vector in the row space must be perpendicular to $x$. A general row-space vector has the form $A^T y$, and the one-line computation confirms the claim:
$$x^T(A^T y) = (Ax)^T y = 0^T y = 0$$
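This identity can be verified numerically. A sketch with a hypothetical rank-1 matrix whose nullspace vector is easy to write down; $y$ is arbitrary, and $A^T y$ is a generic row-space vector.

```python
import numpy as np

# Illustrative rank-1 matrix (chosen for the example): row 2 = 2 * row 1.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

x = np.array([2.0, -1.0])    # Ax = 0, so x is in the nullspace N(A)
print(A @ x)                 # [0. 0.]

# Any row-space vector is A^T y for some y; its dot product with x is
# x^T (A^T y) = (Ax)^T y = 0.
y = np.array([3.0, -5.0])    # arbitrary y
row_vec = A.T @ y            # a vector in the row space C(A^T)
print(np.dot(x, row_vec))    # 0.0
```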
This leads to the beautiful balance of dimensions. In $\mathbb{R}^n$, the dimensions always complement each other: $\dim(C(A^T)) + \dim(N(A)) = n$. Similarly, in $\mathbb{R}^m$, we have $\dim(C(A)) + \dim(N(A^T)) = m$.
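The dimension count can be confirmed in code. A minimal sketch, assuming a $3 \times 4$ example matrix chosen so one row is a sum of the other two (rank 2): the SVD gives both the rank $r$ and a basis for $N(A)$ (the last $n - r$ right singular vectors), so $r + \dim N(A) = n$ can be checked directly.

```python
import numpy as np

# Example matrix: row 3 = row 1 + row 2, so the rank is 2.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])
m, n = A.shape

U, s, Vt = np.linalg.svd(A)
tol = max(m, n) * np.finfo(float).eps * s.max()
r = int((s > tol).sum())      # dim C(A^T) = dim C(A) = rank r

null_basis = Vt[r:].T         # columns span N(A), dimension n - r
print(r, null_basis.shape[1])           # 2 2  (and 2 + 2 = 4 = n)
print(np.allclose(A @ null_basis, 0))   # True: these columns lie in N(A)
```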
The Fredholm Alternative
For any matrix $A$ and right-hand side $b$, exactly one of the following two problems has a solution:
- $Ax = b$: The vector $b$ is in the column space.
- $A^T y = 0$ with $y^T b = 1$: there is a vector $y$ in the left nullspace $N(A^T)$ that is not perpendicular to $b$. Such a $y$ certifies that $b$ is outside the column space, so $Ax = b$ is inconsistent.
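The alternative can be seen concretely. A sketch with a hypothetical singular $2 \times 2$ matrix: here $C(A)$ is the line through $(1, 2)$ and $N(A^T)$ is spanned by $y = (2, -1)$, so $y^T b$ vanishes exactly when $b$ lies in the column space (and $y$ can be rescaled so that $y^T b = 1$ when it does not).

```python
import numpy as np

# Singular example matrix: both columns are multiples of (1, 2).
A = np.array([[1.0, 1.0],
              [2.0, 2.0]])
y = np.array([2.0, -1.0])     # A^T y = 0, so y is in N(A^T)
print(A.T @ y)                # [0. 0.]

b_good = np.array([1.0, 2.0])  # in C(A): Ax = b_good is solvable
b_bad  = np.array([1.0, 0.0])  # not in C(A)

# Exactly one alternative holds: y^T b = 0 iff b is in the column space.
print(np.dot(y, b_good))      # 0.0  (Ax = b_good has a solution)
print(np.dot(y, b_bad))       # 2.0  (no solution; y is the certificate)
```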